
@aws-cdk/aws-codepipeline-actions
This is a developer preview (public beta) module. Releases might lack important features and might have future breaking changes.
This package contains Actions that can be used in a CodePipeline.
import codepipeline = require('@aws-cdk/aws-codepipeline');
import codepipeline_actions = require('@aws-cdk/aws-codepipeline-actions');
To use a CodeCommit Repository in a CodePipeline:
import codecommit = require('@aws-cdk/aws-codecommit');
const repo = new codecommit.Repository(this, 'Repo', {
// ...
});
const pipeline = new codepipeline.Pipeline(this, 'MyPipeline', {
pipelineName: 'MyPipeline',
});
const sourceOutput = new codepipeline.Artifact();
const sourceAction = new codepipeline_actions.CodeCommitSourceAction({
actionName: 'CodeCommit',
repository: repo,
output: sourceOutput,
});
pipeline.addStage({
stageName: 'Source',
actions: [sourceAction],
});
To use GitHub as the source of a CodePipeline:
// Read the secret from Secrets Manager
const sourceOutput = new codepipeline.Artifact();
const sourceAction = new codepipeline_actions.GitHubSourceAction({
actionName: 'GitHub_Source',
owner: 'awslabs',
repo: 'aws-cdk',
oauthToken: cdk.SecretValue.secretsManager('my-github-token'),
output: sourceOutput,
branch: 'develop', // default: 'master'
trigger: codepipeline_actions.GitHubTrigger.POLL // default: 'WEBHOOK', 'NONE' is also possible for no Source trigger
});
pipeline.addStage({
stageName: 'Source',
actions: [sourceAction],
});
To use an S3 Bucket as a source in CodePipeline:
import s3 = require('@aws-cdk/aws-s3');
const sourceBucket = new s3.Bucket(this, 'MyBucket', {
versioned: true, // a Bucket used as a source in CodePipeline must be versioned
});
const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
const sourceOutput = new codepipeline.Artifact();
const sourceAction = new codepipeline_actions.S3SourceAction({
actionName: 'S3Source',
bucket: sourceBucket,
bucketKey: 'path/to/file.zip',
output: sourceOutput,
});
pipeline.addStage({
stageName: 'Source',
actions: [sourceAction],
});
By default, the Pipeline will poll the Bucket to detect changes. You can change that behavior to use CloudWatch Events by setting the trigger property to S3Trigger.EVENTS (it's S3Trigger.POLL by default). If you do that, make sure the source Bucket is part of an AWS CloudTrail Trail - otherwise, the CloudWatch Events will not be emitted, and your Pipeline will not react to changes in the Bucket. You can do it through the CDK:
import cloudtrail = require('@aws-cdk/aws-cloudtrail');
const key = 'some/key.zip';
const trail = new cloudtrail.Trail(this, 'CloudTrail');
trail.addS3EventSelector([sourceBucket.arnForObjects(key)], {
readWriteType: cloudtrail.ReadWriteType.WriteOnly,
});
const sourceAction = new codepipeline_actions.S3SourceAction({
actionName: 'S3Source',
bucketKey: key,
bucket: sourceBucket,
output: sourceOutput,
trigger: codepipeline_actions.S3Trigger.EVENTS, // default: S3Trigger.POLL
});
To use an ECR Repository as a source in a Pipeline:
import ecr = require('@aws-cdk/aws-ecr');
const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
const sourceOutput = new codepipeline.Artifact();
const sourceAction = new codepipeline_actions.EcrSourceAction({
actionName: 'ECR',
repository: ecrRepository,
imageTag: 'some-tag', // optional, default: 'latest'
output: sourceOutput,
});
pipeline.addStage({
stageName: 'Source',
actions: [sourceAction],
});
Example of a CodeBuild Project used in a Pipeline, alongside CodeCommit:
import codebuild = require('@aws-cdk/aws-codebuild');
import codecommit = require('@aws-cdk/aws-codecommit');
const repository = new codecommit.Repository(this, 'MyRepository', {
repositoryName: 'MyRepository',
});
const project = new codebuild.PipelineProject(this, 'MyProject');
const sourceOutput = new codepipeline.Artifact();
const sourceAction = new codepipeline_actions.CodeCommitSourceAction({
actionName: 'CodeCommit',
repository,
output: sourceOutput,
});
const buildAction = new codepipeline_actions.CodeBuildAction({
actionName: 'CodeBuild',
project,
input: sourceOutput,
outputs: [new codepipeline.Artifact()], // optional
});
new codepipeline.Pipeline(this, 'MyPipeline', {
stages: [
{
stageName: 'Source',
actions: [sourceAction],
},
{
stageName: 'Build',
actions: [buildAction],
},
],
});
The default category of the CodeBuild Action is Build; if you want a Test Action instead, override the type property:
const testAction = new codepipeline_actions.CodeBuildAction({
actionName: 'IntegrationTest',
project,
input: sourceOutput,
type: codepipeline_actions.CodeBuildActionType.TEST, // default is BUILD
});
When you want to have multiple inputs and/or outputs for a Project used in a Pipeline, instead of using the secondarySources and secondaryArtifacts properties of the Project class, you need to use the extraInputs and extraOutputs properties of the CodeBuild CodePipeline Actions. Example:
const sourceOutput1 = new codepipeline.Artifact();
const sourceAction1 = new codepipeline_actions.CodeCommitSourceAction({
actionName: 'Source1',
repository: repository1,
output: sourceOutput1,
});
const sourceOutput2 = new codepipeline.Artifact('source2');
const sourceAction2 = new codepipeline_actions.CodeCommitSourceAction({
actionName: 'Source2',
repository: repository2,
output: sourceOutput2,
});
const buildAction = new codepipeline_actions.CodeBuildAction({
actionName: 'Build',
project,
input: sourceOutput1,
extraInputs: [
sourceOutput2, // this is where 'source2' comes from
],
outputs: [
new codepipeline.Artifact('artifact1'), // for better buildspec readability - see below
new codepipeline.Artifact('artifact2'),
],
});
Note: when a CodeBuild Action in a Pipeline has more than one output, it only uses the secondary-artifacts field of the buildspec, never the primary output specification directly under artifacts. Because of that, it pays to explicitly name all output artifacts of that Action, like we did above, so that you know what name to use in the buildspec.
Example buildspec for the above project:
const project = new codebuild.PipelineProject(this, 'MyProject', {
buildSpec: {
version: '0.2',
phases: {
build: {
commands: [
// By default, you're in a directory with the contents of the repository from sourceAction1.
// Use the CODEBUILD_SRC_DIR_source2 environment variable
// to get a path to the directory with the contents of the second input repository.
],
},
},
artifacts: {
'secondary-artifacts': {
'artifact1': {
// primary Action output artifact
// (the 'artifact1' Artifact passed as the first element of outputs above)
},
'artifact2': {
// additional output artifact
// (the 'artifact2' Artifact passed in outputs above)
},
},
},
},
// ...
});
In order to use Jenkins Actions in the Pipeline, you first need to create a JenkinsProvider:
const jenkinsProvider = new codepipeline_actions.JenkinsProvider(this, 'JenkinsProvider', {
providerName: 'MyJenkinsProvider',
serverUrl: 'http://my-jenkins.com:8080',
version: '2', // optional, default: '1'
});
If you've registered a Jenkins provider in a different CDK app, or outside the CDK (in the CodePipeline AWS Console, for example), you can import it:
const jenkinsProvider = codepipeline_actions.JenkinsProvider.import(this, 'JenkinsProvider', {
providerName: 'MyJenkinsProvider',
serverUrl: 'http://my-jenkins.com:8080',
version: '2', // optional, default: '1'
});
Note that a Jenkins provider (identified by the tuple of provider name, category (Build or Test), and version) must always be registered in the given account, in the given AWS region, before it can be used in CodePipeline.
With a JenkinsProvider, we can create a Jenkins Action:
const buildAction = new codepipeline_actions.JenkinsAction({
actionName: 'JenkinsBuild',
jenkinsProvider: jenkinsProvider,
projectName: 'MyProject',
type: codepipeline_actions.JenkinsActionType.BUILD,
});
This module contains Actions that allow you to deploy to CloudFormation from AWS CodePipeline.
For example, the following code fragment defines a pipeline that automatically deploys a CloudFormation template directly from a CodeCommit repository, with a manual approval step in between to confirm the changes:
example Pipeline to deploy CloudFormation
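A minimal sketch of such a pipeline, assuming the pipeline and sourceOutput from the CodeCommit example above, and using illustrative stack, change set, and template names:
// prepare a Change Set from the template in the source output,
// wait for a manual approval, then execute the Change Set
pipeline.addStage({
  stageName: 'Deploy',
  actions: [
    new codepipeline_actions.CloudFormationCreateReplaceChangeSetAction({
      actionName: 'PrepareChanges',
      stackName: 'MyStack', // illustrative
      changeSetName: 'MyChangeSet', // illustrative
      adminPermissions: true,
      templatePath: sourceOutput.atPath('template.yaml'), // assumes this file exists in the repository
      runOrder: 1,
    }),
    new codepipeline_actions.ManualApprovalAction({
      actionName: 'ApproveChanges',
      runOrder: 2,
    }),
    new codepipeline_actions.CloudFormationExecuteChangeSetAction({
      actionName: 'ExecuteChanges',
      stackName: 'MyStack',
      changeSetName: 'MyChangeSet',
      runOrder: 3,
    }),
  ],
});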
See the AWS documentation for more details about using CloudFormation in CodePipeline.
This package defines the following actions:
- CloudFormationCreateReplaceChangeSetAction - create, or replace an existing, Change Set for a CloudFormation Stack.
- CloudFormationExecuteChangeSetAction - execute a Change Set prepared previously.
- CloudFormationCreateUpdateStackAction - deploy a Stack directly from a template; if the Stack doesn't exist, it will be created, otherwise it will be updated (a Stack in a failed state is only replaced when replaceOnFailure is set to true, in which case it will be destroyed and recreated).
- CloudFormationDeleteStackAction - delete a Stack.
If you want to deploy your Lambda through CodePipeline, and you don't use assets (for example, because your CDK code and Lambda code are separate), you can use a special Lambda Code class, CfnParametersCode. Note that your Lambda must be in a different Stack than your Pipeline. The Lambda itself will be deployed, alongside the entire Stack it belongs to, using a CloudFormation CodePipeline Action. Example:
Example of deploying a Lambda through CodePipeline
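A minimal sketch of the Pipeline side, assuming the Lambda Stack's synthesized template ends up in a cdkBuildOutput Artifact and the Lambda bundle in a lambdaBuildOutput Artifact (both Artifact names and the template file name are illustrative):
// deploy the Lambda Stack with a CloudFormation Action,
// pointing the CfnParametersCode at the Artifact that holds the Lambda bundle
pipeline.addStage({
  stageName: 'Deploy',
  actions: [
    new codepipeline_actions.CloudFormationCreateUpdateStackAction({
      actionName: 'Lambda_CFN_Deploy',
      stackName: 'LambdaStackDeployedName', // illustrative
      templatePath: cdkBuildOutput.atPath('LambdaStack.template.json'), // assumed template file name
      adminPermissions: true,
      // fill in the CfnParametersCode parameters with the S3 location of the Lambda bundle Artifact
      parameterOverrides: lambdaCode.assign(lambdaBuildOutput.s3Location),
      extraInputs: [lambdaBuildOutput],
    }),
  ],
});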
To use CodeDeploy for EC2/on-premise deployments in a Pipeline:
import codedeploy = require('@aws-cdk/aws-codedeploy');
const pipeline = new codepipeline.Pipeline(this, 'MyPipeline', {
pipelineName: 'MyPipeline',
});
// add the source and build Stages to the Pipeline...
const deployAction = new codepipeline_actions.CodeDeployServerDeployAction({
actionName: 'CodeDeploy',
input: buildOutput,
deploymentGroup,
});
pipeline.addStage({
stageName: 'Deploy',
actions: [deployAction],
});
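The deploymentGroup above is assumed to exist already; a minimal sketch of defining one, reusing the codedeploy import from the snippet above, with illustrative names:
// an EC2/on-premise CodeDeploy Application and Deployment Group;
// instances are registered with the Deployment Group separately
// (for example through Auto Scaling Groups or instance tags)
const application = new codedeploy.ServerApplication(this, 'CodeDeployApplication', {
  applicationName: 'MyApplication', // illustrative
});
const deploymentGroup = new codedeploy.ServerDeploymentGroup(this, 'DeploymentGroup', {
  application,
  deploymentGroupName: 'MyDeploymentGroup', // illustrative
});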
To use CodeDeploy for blue-green Lambda deployments in a Pipeline:
const lambdaCode = lambda.Code.cfnParameters();
const func = new lambda.Function(lambdaStack, 'Lambda', {
code: lambdaCode,
handler: 'index.handler',
runtime: lambda.Runtime.Nodejs810,
});
// used to make sure each CDK synthesis produces a different Version
const version = func.newVersion();
const alias = new lambda.Alias(lambdaStack, 'LambdaAlias', {
aliasName: 'Prod',
version,
});
new codedeploy.LambdaDeploymentGroup(lambdaStack, 'DeploymentGroup', {
alias,
deploymentConfig: codedeploy.LambdaDeploymentConfig.Linear10PercentEvery1Minute,
});
Then, you need to create your Pipeline Stack,
where you will define your Pipeline,
and deploy the lambdaStack
using a CloudFormation CodePipeline Action
(see above for a complete example).
CodePipeline can deploy an ECS service. The deploy Action receives one input Artifact which contains the image definition file:
const deployStage = pipeline.addStage({
stageName: 'Deploy',
actions: [
new codepipeline_actions.EcsDeployAction({
actionName: 'DeployAction',
service,
// if your file is called imagedefinitions.json,
// use the `input` property,
// and leave out the `imageFile` property
input: buildOutput,
// if your file name is _not_ imagedefinitions.json,
// use the `imageFile` property,
// and leave out the `input` property
imageFile: buildOutput.atPath('imageDef.json'),
}),
],
});
To use an S3 Bucket as a deployment target in CodePipeline:
const targetBucket = new s3.Bucket(this, 'MyBucket', {});
const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
const deployAction = new codepipeline_actions.S3DeployAction({
actionName: 'S3Deploy',
bucket: targetBucket,
input: sourceOutput,
});
const deployStage = pipeline.addStage({
stageName: 'Deploy',
actions: [deployAction],
});
You can deploy to Alexa using CodePipeline with the following Action:
// Read the secrets from ParameterStore
const clientId = new cdk.SecretParameter(this, 'AlexaClientId', { ssmParameter: '/Alexa/ClientId' });
const clientSecret = new cdk.SecretParameter(this, 'AlexaClientSecret', { ssmParameter: '/Alexa/ClientSecret' });
const refreshToken = new cdk.SecretParameter(this, 'AlexaRefreshToken', { ssmParameter: '/Alexa/RefreshToken' });
// Add deploy action
new codepipeline_actions.AlexaSkillDeployAction({
actionName: 'DeploySkill',
runOrder: 1,
input: sourceOutput,
clientId: clientId.value,
clientSecret: clientSecret.value,
refreshToken: refreshToken.value,
skillId: 'amzn1.ask.skill.12345678-1234-1234-1234-123456789012',
});
If you need manifest overrides you can specify them as parameterOverridesArtifact in the action:
import cloudformation = require('@aws-cdk/aws-cloudformation');
// Deploy some CFN change set and store output
const executeOutput = new codepipeline.Artifact('CloudFormation');
const executeChangeSetAction = new codepipeline_actions.CloudFormationExecuteChangeSetAction({
actionName: 'ExecuteChangesTest',
runOrder: 2,
stackName,
changeSetName,
outputFileName: 'overrides.json',
output: executeOutput,
});
// Provide CFN output as manifest overrides
new codepipeline_actions.AlexaSkillDeployAction({
actionName: 'DeploySkill',
runOrder: 1,
input: sourceOutput,
parameterOverridesArtifact: executeOutput,
clientId: clientId.value,
clientSecret: clientSecret.value,
refreshToken: refreshToken.value,
skillId: 'amzn1.ask.skill.12345678-1234-1234-1234-123456789012',
});
This package contains an Action that stops the Pipeline until someone manually clicks the approve button:
import sns = require('@aws-cdk/aws-sns');
const manualApprovalAction = new codepipeline_actions.ManualApprovalAction({
actionName: 'Approve',
notificationTopic: new sns.Topic(this, 'Topic'), // optional
notifyEmails: [
'some_email@example.com',
], // optional
additionalInformation: 'additional info', // optional
});
approveStage.addAction(manualApprovalAction);
// `manualApprovalAction.notificationTopic` can be used to access the Topic
// after the Action has been added to a Pipeline
If the notificationTopic has not been provided, but notifyEmails were, a new SNS Topic will be created (and accessible through the notificationTopic property of the Action).
This module contains an Action that allows you to invoke a Lambda function in a Pipeline:
import lambda = require('@aws-cdk/aws-lambda');
const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
const lambdaAction = new codepipeline_actions.LambdaInvokeAction({
actionName: 'Lambda',
lambda: fn,
});
pipeline.addStage({
stageName: 'Lambda',
actions: [lambdaAction],
});
The Lambda Action can have up to 5 inputs, and up to 5 outputs:
const lambdaAction = new codepipeline_actions.LambdaInvokeAction({
actionName: 'Lambda',
inputs: [
sourceOutput,
buildOutput,
],
outputs: [
new codepipeline.Artifact('Out1'),
new codepipeline.Artifact('Out2'),
],
});
See the AWS documentation on how to write a Lambda function invoked from CodePipeline.
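For reference, a minimal sketch of such a handler (assuming a Node.js runtime with the AWS SDK v2; this is not taken from the CDK docs): CodePipeline passes the job in event['CodePipeline.job'], and the function must report the result back through the CodePipeline API, otherwise the Action will eventually time out.
import AWS = require('aws-sdk');

export async function handler(event: any) {
  const codePipeline = new AWS.CodePipeline();
  const jobId = event['CodePipeline.job'].id;
  try {
    // ... do the actual work of the Action here ...
    await codePipeline.putJobSuccessResult({ jobId }).promise();
  } catch (err) {
    await codePipeline.putJobFailureResult({
      jobId,
      failureDetails: {
        type: 'JobFailed',
        message: (err as Error).message,
      },
    }).promise();
  }
}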
0.37.0 (2019-07-04)
- construct.findChild() now only looks up direct children
- Port.toRuleJSON was renamed to toRuleJson
- PipelineProject.addSecondaryArtifact now returns void (formerly any)
- Project.addSecondaryArtifact now returns void (formerly any)